In many high-impact applications, it is important to ensure the quality of the output of a machine learning algorithm, as well as its reliability relative to the complexity of the algorithm used. In this paper, we initiate a mathematically rigorous theory to decide which models (algorithms applied to data sets) are close to each other in terms of certain metrics, such as performance and the complexity level of the algorithm. This involves creating a grid on the hypothetical spaces of data sets and algorithms so as to identify a finite set of probability distributions from which the data sets are sampled and a finite set of algorithms. A given threshold metric acting on this grid expresses the nearness (or statistical distance) of each algorithm and data set of interest to any given application. A technically difficult part of this project is to estimate the so-called metric entropy of a compact subset of functions of \textbf{infinitely many variables} that arise in the definition of these spaces.
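The metric entropy mentioned above is the logarithm of a covering number. As a minimal illustration of that notion (not the paper's construction, which concerns function spaces of infinitely many variables), the sketch below estimates the covering number of a finite sample from a compact set with a greedy epsilon-net; the function and variable names are illustrative.

```python
# Sketch: estimate the covering number N(eps) of a sample from a compact
# set via a greedy epsilon-net; the metric entropy is log N(eps).
# Illustrative example only -- not the construction used in the paper.
import numpy as np

def greedy_epsilon_net(points, eps):
    """Greedily pick centers so that every point lies within eps of a center."""
    centers = []
    remaining = list(range(len(points)))
    while remaining:
        c = remaining[0]
        centers.append(points[c])
        # Keep only the points not yet covered by the new center.
        remaining = [i for i in remaining
                     if np.linalg.norm(points[i] - points[c]) > eps]
    return centers

rng = np.random.default_rng(0)
sample = rng.uniform(0.0, 1.0, size=(500, 2))   # sample from the unit square
net = greedy_epsilon_net(sample, eps=0.25)
entropy_estimate = np.log(len(net))             # metric entropy ~ log N(eps)
```

Shrinking `eps` grows the net; the rate at which `log N(eps)` grows as `eps` decreases is exactly the quantity that becomes delicate for the infinite-variable function classes considered in the paper.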
-
In this paper, we present a sharper version of the results in the paper Dimension independent bounds for general shallow networks; Neural Networks, \textbf{123} (2020), 142-152. Let $$\mathbb{X}$$ and $$\mathbb{Y}$$ be compact metric spaces. We consider approximation of functions of the form $$ x\mapsto\int_{\mathbb{Y}} G( x, y)d\tau( y)$$, $$ x\in\mathbb{X}$$, by $$G$$-networks of the form $$ x\mapsto \sum_{k=1}^n a_kG( x, y_k)$$, $$ y_1,\cdots, y_n\in\mathbb{Y}$$, $$a_1,\cdots, a_n\in\mathbb{R}$$. Defining the dimensions of $$\mathbb{X}$$ and $$\mathbb{Y}$$ in terms of covering numbers, we obtain dimension-independent bounds on the degree of approximation in terms of $$n$$, where the constants involved depend at most polynomially on the dimensions. Applications include approximation by power rectified linear unit networks, zonal function networks, and certain radial basis function networks, as well as the important problem of function extension to higher-dimensional spaces.
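A minimal numerical sketch of the $$G$$-network idea: approximate $$F(x)=\int_{\mathbb{Y}} G(x,y)\,d\tau(y)$$ by the $$n$$-term sum $$\sum_{k=1}^n a_k G(x,y_k)$$. The concrete choices below (a Gaussian $$G$$, $$\tau$$ the uniform measure on $$[0,1]$$, equal Monte Carlo weights $$a_k=1/n$$) are illustrative assumptions, not the paper's setting or its rate-optimal node selection.

```python
# Sketch of a G-network approximating F(x) = \int G(x,y) d tau(y).
# Assumed choices (not from the paper): Gaussian G, tau = uniform on [0,1],
# equal weights a_k = 1/n (plain Monte Carlo sampling of the nodes y_k).
import numpy as np

def G(x, y):
    return np.exp(-(x - y) ** 2)        # illustrative kernel

rng = np.random.default_rng(1)
n = 2000
y_nodes = rng.uniform(0.0, 1.0, n)      # y_1, ..., y_n sampled from tau
a = np.full(n, 1.0 / n)                 # a_k = 1/n

def network(x):
    """The n-term G-network x -> sum_k a_k G(x, y_k)."""
    return np.sum(a * G(x, y_nodes))

def F(x, m=200001):
    """Reference value of the integral via a dense uniform grid."""
    grid = np.linspace(0.0, 1.0, m)
    return G(x, grid).mean()

x0 = 0.3
err = abs(network(x0) - F(x0))          # Monte Carlo error, O(n^{-1/2})
```

Plain Monte Carlo only yields the $$O(n^{-1/2})$$ rate; the point of the paper is that carefully chosen nodes and weights give dimension-independent bounds with constants polynomial in the covering-number dimensions of $$\mathbb{X}$$ and $$\mathbb{Y}$$.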
-
In this work, we develop numerical methods to solve forward and inverse wave problems for a nonlinear Helmholtz equation defined in a spherical shell between two concentric spheres centred at the origin. A spectral method is developed to solve the forward problem, while a combination of a finite difference approximation and the least squares method is derived for the inverse problem. Numerical examples are given to verify the method.
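As a much-simplified analogue of the forward problem (not the paper's spectral method on a spherical shell), the sketch below solves a one-dimensional nonlinear Helmholtz equation $$u'' + k^2(1+\varepsilon u^2)u = f$$ on $$(0,1)$$ with homogeneous Dirichlet conditions, using finite differences and Picard (fixed-point) iteration; the equation, source term, and parameter values are illustrative assumptions.

```python
# Simplified 1D analogue of a forward nonlinear Helmholtz solve:
#   u'' + k^2 (1 + eps*u^2) u = f  on (0,1),  u(0) = u(1) = 0.
# Finite differences + Picard iteration; the paper itself uses a spectral
# method in a spherical shell, which this sketch does not reproduce.
import numpy as np

m, k, eps = 200, 2.0, 0.01
h = 1.0 / (m + 1)
x = np.linspace(h, 1.0 - h, m)          # interior grid points
f = np.sin(np.pi * x)                   # illustrative source term

# Standard second-difference matrix for u'' with Dirichlet conditions.
D2 = (np.diag(np.full(m - 1, 1.0), -1)
      - 2.0 * np.eye(m)
      + np.diag(np.full(m - 1, 1.0), 1)) / h**2

u = np.zeros(m)
for _ in range(100):                    # Picard: freeze u^2, solve, repeat
    A = D2 + k**2 * np.diag(1.0 + eps * u**2)
    u_new = np.linalg.solve(A, f)
    if np.max(np.abs(u_new - u)) < 1e-10:
        u = u_new
        break
    u = u_new

# Discrete residual of the nonlinear equation at the computed solution.
residual = np.max(np.abs(D2 @ u + k**2 * (1.0 + eps * u**2) * u - f))
```

For the weak nonlinearity chosen here the Picard map is contractive and the iteration converges in a handful of steps; stronger nonlinearities or near-resonant $$k$$ generally require Newton-type iterations and the radiation-type boundary treatments discussed in the literature the paper builds on.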
-
This paper introduces kdiff, a novel kernel-based measure for estimating distances between instances of time series, random fields and other forms of structured data. This measure is based on the idea of matching distributions that overlap over only a portion of their region of support. Our proposed measure is inspired by MPdist, which has previously been proposed for such datasets and is constructed using Euclidean metrics, whereas kdiff is constructed using non-linear kernel distances. In addition, kdiff accounts for both self and cross similarities across the instances and is defined using a lower quantile of the distance distribution. Comparing the cross similarity to the self similarity yields similarity measures that are more robust to noise and partial occlusions of the relevant signals. Our proposed measure kdiff is a more general form of the well-known kernel-based Maximum Mean Discrepancy distance estimated over the embeddings. Some theoretical results are provided for separability conditions using kdiff as a distance measure for clustering and classification problems where the embedding distributions can be modeled as two-component mixtures. Applications are demonstrated for clustering of synthetic and real-life time series and image data, and the performance of kdiff is compared to competing distance measures for clustering.
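A simplified sketch of the underlying idea: compare a lower quantile of the cross kernel-distance distribution between two samples to the same quantile of the self distances within one sample. The Gaussian kernel, the quantile level `q`, and the function names below are illustrative assumptions, not the paper's exact construction of kdiff.

```python
# Simplified kdiff-style comparison: lower-quantile cross kernel distance
# minus lower-quantile self kernel distance.  Kernel choice and quantile
# level are illustrative, not the paper's exact definition.
import numpy as np

def kernel_distance(a, b, gamma=1.0):
    """Distance induced by a Gaussian kernel: d^2 = k(a,a)+k(b,b)-2k(a,b)."""
    k = lambda u, v: np.exp(-gamma * (u - v) ** 2)
    return np.sqrt(k(a, a) + k(b, b) - 2.0 * k(a, b))

def kdiff_like(X, Y, q=0.1):
    """Lower quantile of cross distances minus lower quantile of self distances."""
    cross = [kernel_distance(x, y) for x in X for y in Y]
    self_ = [kernel_distance(x1, x2) for i, x1 in enumerate(X)
             for j, x2 in enumerate(X) if i < j]
    return np.quantile(cross, q) - np.quantile(self_, q)

rng = np.random.default_rng(2)
X = rng.normal(0.0, 0.1, 50)
near = rng.normal(0.0, 0.1, 50)   # same distribution as X
far = rng.normal(3.0, 0.1, 50)    # well-separated distribution
```

Using a low quantile rather than a mean is what makes this kind of measure tolerant of partial overlap: two samples that match on only a fraction of their support can still produce many small cross distances in the lower tail.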
